release probability
As a major novelty, the authors propose that the stochasticity of synaptic transmission directly implements the randomness required for Monte Carlo sampling. The neurons used throughout the paper are binary threshold units, not spiking neurons. Binary units can provide useful insights into how a neuronal network may solve computational problems, but it is important to distinguish between implementations using binary units and spiking neurons. The authors include a short section about a spike-based implementation in the appendix, but they do not demonstrate that the spike-based implementation can perform the same tasks with similar performance.
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.71)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Undirected Networks > Markov Models (0.51)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty (0.48)
Adaptive Synaptic Failure Enables Sampling from Posterior Predictive Distributions in the Brain
McKee, Kevin, Crandell, Ian, Chaudhuri, Rishidev, O'Reilly, Randall
Bayesian interpretations of neural processing require that biological mechanisms represent and operate upon probability distributions in accordance with Bayes' theorem. Many have speculated that synaptic failure constitutes a mechanism of variational, i.e., approximate, Bayesian inference in the brain. Whereas previous models have used synaptic failure to sample over uncertainty in model parameters, we demonstrate that by adapting transmission probabilities to learned network weights, synaptic failure can sample not only over model uncertainty but also over complete posterior predictive distributions. Our results potentially explain the brain's ability to perform probabilistic searches and to approximate complex integrals. These operations are involved in numerous calculations, including likelihood evaluation and state value estimation for complex planning.
- North America > United States > California > Yolo County > Davis (0.04)
- North America > United States > Virginia (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.88)
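The abstract above describes adapting per-synapse transmission probabilities to learned weights so that stochastic failures generate samples from a posterior predictive distribution. The sketch below is a minimal illustration of that idea with binary threshold units; the sigmoidal weight-to-release-probability mapping and the layer sizes are assumptions for illustration, not the mapping derived in the paper.

```python
# Minimal sketch: Monte Carlo forward passes through a layer of binary
# threshold units whose synapses fail stochastically. The mapping from
# weight magnitude to release probability used here (a simple squashing
# function) is illustrative only; the paper derives its own mapping.
import numpy as np

rng = np.random.default_rng(0)

def release_probs(W, gain=1.0):
    """Map learned weights to per-synapse release probabilities (assumed form)."""
    return 1.0 / (1.0 + np.exp(-gain * np.abs(W)))

def stochastic_forward(x, W, threshold=0.0):
    """One forward pass: each synapse transmits only if its Bernoulli draw succeeds."""
    mask = rng.random(W.shape) < release_probs(W)   # synaptic failures
    h = (W * mask) @ x                              # effective input after failures
    return (h > threshold).astype(float)            # binary threshold units

W = rng.normal(size=(5, 10))
x = rng.binomial(1, 0.5, size=10).astype(float)
samples = np.stack([stochastic_forward(x, W) for _ in range(1000)])
print("per-unit firing probability:", samples.mean(axis=0))
```

Averaging many such passes recovers per-unit firing probabilities, while each individual pass behaves as one sample from the resulting predictive distribution.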
Locally Learned Synaptic Dropout for Complete Bayesian Inference
McKee, Kevin L., Crandell, Ian C., Chaudhuri, Rishidev, O'Reilly, Randall C.
The Bayesian brain hypothesis postulates that the brain accurately operates on statistical distributions according to Bayes' theorem. The random failure of presynaptic vesicles to release neurotransmitters may allow the brain to sample from posterior distributions of network parameters, interpreted as epistemic uncertainty. It has not been shown previously how random failures might allow networks to sample from observed distributions, also known as aleatoric or residual uncertainty. Sampling from both distributions enables probabilistic inference, efficient search, and creative or generative problem solving. We demonstrate that under a population-code-based interpretation of neural activity, both types of distribution can be represented and sampled with synaptic failure alone. We first define a biologically constrained neural network and sampling scheme based on synaptic failure and lateral inhibition. Within this framework, we derive dropout-based epistemic uncertainty, then prove an analytic mapping from synaptic efficacy to release probability that allows networks to sample from arbitrary, learned distributions represented by a receiving layer. Second, our result leads to a local learning rule by which synapses adapt their release probabilities. Our result demonstrates complete Bayesian inference, related to the variational learning method of dropout, in a biologically constrained network using only locally learned synaptic failure rates.

Introduction

The Bayesian Brain hypothesis has led to a number of important insights about neural coding in the brain (Knill and Pouget, 2004; Friston, 2010, 2012; Pouget et al., 2013; Lee and Mumford, 2003) by characterizing neural representation and processing in terms of formal probabilistic inference and sampling. Furthermore, the introduction of related probabilistic representations and sampling processes in modern deep learning variational models has led to improved performance on a range of tasks (Zhang et al., 2019; Blei et al., 2017; Kingma and Welling, 2014; Detorakis et al., 2019). The widely used dropout technique in deep learning can be seen as a form of variational inference and sampling (Srivastava et al., 2014; Gal and Ghahramani, 2016), with a direct analogy to the random failure of synapses in the brain. This link has led to biologically motivated models of variational deep learning that use network weight dropout to simulate synaptic failure (Mostafa and Cauwenberghs, 2018; Wan et al., 2013; Neftci et al., 2016). In this paper, we build on these and other recent findings in machine learning and neurobiology to show how the brain can accurately represent the two primary components of probabilistic inference, distributions of observed data and distributions of unobserved values (such as model parameters), with the single, biologically established mechanism of synaptic failure.
- North America > United States > California > Yolo County > Davis (0.04)
- North America > United States > Virginia (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.88)
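The abstract above ties synaptic failure to the dropout-as-variational-inference view of Gal and Ghahramani (2016). The sketch below illustrates that general idea with per-synapse (weight-level) Bernoulli masks kept active at test time, reading the spread of repeated passes as epistemic uncertainty; the two-layer ReLU network, sizes, and fixed keep probability are illustrative assumptions and do not reproduce the paper's analytic efficacy-to-release-probability mapping or its local learning rule.

```python
# Minimal sketch of Monte Carlo dropout in the spirit of Gal and Ghahramani (2016):
# the random masks stay on at test time, and the spread of repeated predictions is
# read as epistemic uncertainty. Per-synapse (weight-level) masks are used to match
# the synaptic-failure framing; all shapes and rates below are assumptions.
import numpy as np

rng = np.random.default_rng(1)

def mc_dropout_predict(x, W1, W2, p_keep=0.8, n_samples=500):
    """Repeated stochastic passes; returns predictive mean and variance."""
    preds = []
    for _ in range(n_samples):
        m1 = rng.random(W1.shape) < p_keep            # which synapses transmit
        h = np.maximum((W1 * m1) @ x / p_keep, 0.0)   # ReLU hidden layer, rescaled
        m2 = rng.random(W2.shape) < p_keep
        preds.append((W2 * m2) @ h / p_keep)
    preds = np.array(preds)
    return preds.mean(axis=0), preds.var(axis=0)

W1 = rng.normal(scale=0.3, size=(20, 10))
W2 = rng.normal(scale=0.3, size=(3, 20))
mean, var = mc_dropout_predict(rng.normal(size=10), W1, W2)
print("predictive mean:", mean)
print("epistemic variance:", var)
```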
Quantal synaptic dilution enhances sparse encoding and dropout regularisation in deep networks
Dropout is a technique that silences the activity of units stochastically while training deep networks to reduce overfitting. Here we introduce Quantal Synaptic Dilution (QSD), a biologically plausible model of dropout regularisation based on the quantal properties of neuronal synapses, that incorporates heterogeneities in response magnitudes and release probabilities for vesicular quanta. QSD outperforms standard dropout in ReLU multilayer perceptrons, with enhanced sparse encoding at test time when dropout masks are replaced with identity functions, without shifts in trainable weight or bias distributions. For convolutional networks, the method also improves generalisation in computer vision tasks with and without inclusion of additional forms of regularisation. QSD also outperforms standard dropout in recurrent networks for language modelling and sentiment analysis. An advantage of QSD over many variations of dropout is that it can be implemented generally in all conventional deep networks where standard dropout is applicable.
- Europe > United Kingdom (0.04)
- Asia (0.04)
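The QSD abstract describes a dropout variant with heterogeneous release probabilities and quantal response magnitudes, with masks replaced by the identity at test time. The sketch below illustrates such a heterogeneous mask in the abstract's qualitative terms; the Beta and Gamma distributions and all sizes are assumptions, not the paper's formulation.

```python
# Illustrative sketch only: a dropout-style mask with heterogeneous per-connection
# release probabilities and quantal response magnitudes, following the QSD abstract
# qualitatively. The distributions and sizes below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(2)

def quantal_mask(p_release, amplitude):
    """Each connection transmits with its own probability and its own magnitude."""
    released = rng.random(p_release.shape) < p_release
    return released * amplitude

shape = (64, 128)
p_release = rng.beta(2.0, 2.0, size=shape)     # heterogeneous release probabilities
amplitude = rng.gamma(4.0, 0.25, size=shape)   # heterogeneous quantal magnitudes

W = rng.normal(size=shape)
x = rng.normal(size=128)
h_train = (W * quantal_mask(p_release, amplitude)) @ x  # stochastic pass during training
h_test = W @ x  # masks replaced by the identity at test time, as in the abstract
```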
Dynamic Stochastic Synapses as Computational Units
Maass, Wolfgang, Zador, Anthony M.
In most neural network models, synapses are treated as static weights that change only on the slow time scales of learning. In fact, however, synapses are highly dynamic, and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. Changes in release probability represent one of the main mechanisms by which synaptic efficacy is modulated in neural circuits. We propose and investigate a simple model for dynamic stochastic synapses that can easily be integrated into common models for neural computation. We show through computer simulations and rigorous theoretical analysis that this model for a dynamic stochastic synapse increases computational power in a nontrivial way. Our results may have implications for the processing of time-varying signals by both biological and artificial neural networks. A synapse S carries out computations on spike trains, more precisely on trains of spikes from the presynaptic neuron. Each spike from the presynaptic neuron may or may not trigger the release of a neurotransmitter-filled vesicle at the synapse.
- Europe > Austria > Styria > Graz (0.05)
- North America > United States > New York (0.04)
- North America > United States > California > San Diego County > La Jolla (0.04)
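The abstract above treats a synapse as a unit that responds to each presynaptic spike by releasing a vesicle with some probability, and that probability is itself dynamic. The sketch below simulates one such stochastic synapse with a generic facilitation-and-decay rule for the release probability; the dynamics and parameter values are illustrative assumptions, not the specific model analyzed by Maass and Zador.

```python
# Minimal sketch of a dynamic stochastic synapse: each presynaptic spike releases
# a vesicle with probability p(t), and p(t) itself is use-dependent. The
# facilitation-with-decay dynamics below are a generic illustration only.
import numpy as np

rng = np.random.default_rng(3)

def run_synapse(spike_times, p0=0.2, facilitation=0.1, tau=50.0):
    """Return which presynaptic spikes trigger vesicle release."""
    p, last_t, releases = p0, None, []
    for t in spike_times:
        if last_t is not None:
            # release probability relaxes back toward its baseline between spikes
            p = p0 + (p - p0) * np.exp(-(t - last_t) / tau)
        releases.append(bool(rng.random() < p))   # stochastic release, often < 1
        p = min(1.0, p + facilitation)            # use-dependent facilitation
        last_t = t
    return releases

spike_times = np.sort(rng.uniform(0, 500, size=40))  # presynaptic spike train (ms)
print(run_synapse(spike_times))
```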